16 research outputs found

    Gaze Awareness in Computer-Mediated Collaborative Physical Tasks

    Human eyes play an important role in everyday social interactions. However, the cues provided by eye movements are often missing or difficult to interpret in computer-mediated remote collaboration. Motivated by the increasing availability of gaze-tracking devices in the consumer market and the growing need for improved remote-collaboration systems, this thesis evaluated the value of gaze awareness in a number of video-based remote-collaboration situations. This thesis comprises six publications that enhance our understanding of the everyday use of gaze-tracking technology and the value of shared gaze in remote collaboration in the physical world. The studies covered a variety of collaborative scenarios involving different camera configurations (stationary, handheld, and head-mounted cameras), display setups (screen-based and projection displays), mobility requirements (stationary and mobile tasks), and task characteristics (pointing and procedural tasks). The aim was to understand the costs and benefits of shared gaze in video-based collaborative physical tasks. The findings suggest that gaze awareness is useful in remote collaboration for physical tasks. Shared gaze enables efficient communication of spatial information, helps viewers predict task-relevant intentions, and improves situational awareness. However, different contextual factors can influence the utility of shared gaze. Shared gaze was more useful when the collaborative task involved communicating pointing information rather than procedural information, the collaborators were mutually aware of the shared gaze, and the gaze-tracking quality was accurate enough to meet the task requirements. In addition, the results suggest that the collaborators’ roles can also affect the perceived utility of shared gaze.
Methodologically, this thesis sets a precedent in shared gaze research by reporting the objective gaze data quality achieved in the studies, and it provides tools for other researchers to objectively assess gaze data quality in different research phases. The findings of this thesis can contribute towards designing future remote-collaboration systems; towards the vision of pervasive gaze-based interaction; and towards improved validity, repeatability, and comparability of research involving gaze trackers.

    Mobile gaze interaction : gaze gestures with haptic feedback

    There has been an increasing need for alternative interaction techniques to support mobile usage contexts. Gaze-tracking technology is anticipated to soon appear in commercial mobile devices. There are two important considerations when designing mobile gaze interactions. Firstly, the interaction should be robust to accuracy problems. Secondly, user feedback should be instantaneous, meaningful, and appropriate to ease the interaction. This thesis proposes gaze-gesture input with haptic feedback as an interaction technique in the mobile context. This work presents the results of an experiment conducted to understand the effectiveness of vibrotactile feedback in two-stroke gaze-gesture-based mobile interaction and to find the best temporal point, in terms of gesture progression, at which to provide the feedback. Four feedback conditions were used: NO (no tactile feedback), OUT (tactile feedback at the end of the first stroke), FULL (tactile feedback at the end of the second stroke), and BOTH (tactile feedback at the end of the first and second strokes). The results suggest that haptic feedback does help the interaction. The participants completed the tasks with fewer errors when haptic feedback was provided. The feedback conditions OUT and BOTH were found to be equally effective in terms of task completion time. The participants also subjectively rated these feedback conditions as more comfortable and easier to use than the FULL and NO feedback conditions.
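    The four feedback conditions above amount to a small schedule mapping each condition to the stroke completions at which the actuator fires. The sketch below is illustrative only; the function names and the `pulse` callback are assumptions, not part of the published system.

```python
# Sketch of the four vibrotactile feedback conditions for a
# two-stroke gaze gesture. Names and pulse() are illustrative.

def feedback_events(condition):
    """Return the set of stroke indices after which a tactile pulse fires."""
    schedule = {
        "NO":   set(),        # no tactile feedback
        "OUT":  {1},          # pulse after the first stroke
        "FULL": {2},          # pulse after the second stroke
        "BOTH": {1, 2},       # pulse after both strokes
    }
    return schedule[condition]

def on_stroke_completed(condition, stroke_index, pulse):
    """Fire the actuator callback when the schedule says so."""
    if stroke_index in feedback_events(condition):
        pulse()
```

    Keeping the schedule as data rather than branching logic makes it easy to add or compare further feedback conditions in a study.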

    Haptic feedback in eye typing

    Proper feedback is essential in gaze-based interfaces, where the same modality is used for both perception and control. We measured how vibrotactile feedback, a form of haptic feedback, compares with the commonly used visual and auditory feedback in eye typing. Haptic feedback was found to produce results close to those of auditory feedback; both were easy to perceive, and participants liked both the auditory “click” and the tactile “tap” of the selected key. Implementation details (such as the placement of the haptic actuator) were also found to be important.

    Toward a Real-Time Index of Pupillary Activity as an Indicator of Cognitive Load

    The Low/High Index of Pupillary Activity (LHIPA), an eye-tracked measure of pupil-diameter oscillation, is redesigned and implemented to function in real time. The novel Real-time IPA (RIPA) is shown to discriminate cognitive load in re-streamed data from earlier experiments. The rationale for the RIPA is tied to the functioning of the human autonomic nervous system, yielding a hybrid measure based on the ratio of low to high frequencies of pupil oscillation. The paper’s contribution is the documentation it provides of the calculation of the RIPA. As with the LHIPA, researchers can apply this metric to their own experiments where a measure of cognitive load is of interest.
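    The core idea, a ratio of low- to high-frequency pupil oscillation over a window of samples, can be sketched as below. Note this is a simplified FFT band-power illustration under assumed band edges; the published LHIPA/RIPA is defined via a wavelet decomposition, so this is not the authors' algorithm.

```python
# Minimal sketch of a low/high pupillary-activity ratio over a window
# of pupil-diameter samples. The 1 Hz band split is an assumption made
# only for illustration; the published metric uses wavelets instead.

import numpy as np

def low_high_ratio(diameters, fs, split_hz=1.0):
    """Ratio of spectral power below vs. above split_hz.

    diameters : 1-D sequence of pupil-diameter samples
    fs        : sampling rate in Hz
    """
    x = np.asarray(diameters, dtype=float)
    x = x - x.mean()                        # remove the DC component
    power = np.abs(np.fft.rfft(x)) ** 2     # one-sided power spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    low = power[(freqs > 0) & (freqs <= split_hz)].sum()
    high = power[freqs > split_hz].sum()
    return low / high if high > 0 else float("inf")
```

    A slow oscillation dominated by low-frequency power yields a ratio above 1, a fast oscillation a ratio below 1, which is the directional behavior a low/high index relies on.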

    Gaze-based Kinaesthetic Interaction for Virtual Reality

    Kinaesthetic interaction using force-feedback devices is promising in virtual reality. However, the devices are currently not suitable for interactions within large virtual spaces because of their limited workspace. We developed a novel gaze-based kinaesthetic interface that employs the user’s gaze to relocate the device workspace. The workspace switches to a new location when the user pulls the mechanical arm of the device to its reset position and gazes at the new target. This design enables robust relocation of the device workspace, thus achieving an effectively unlimited interaction space, while maintaining flexible hand-based kinaesthetic exploration. We compared the new interface with the traditional scaling-based interface in an experiment involving softness and smoothness discrimination. Our results showed that the gaze-based interface performs better than the traditional interface in terms of efficiency and kinaesthetic perception. It improves the user experience of kinaesthetic interaction in virtual reality without increasing eye strain.
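    The relocation rule described above, switch the workspace only when the arm is parked at its reset position and the gaze target lies outside the current workspace, can be sketched as a simple predicate. All names and tolerance values here are assumptions for illustration, not taken from the published interface.

```python
# Illustrative sketch of the gaze-based workspace-relocation rule:
# the haptic workspace jumps to the gaze target only when the device
# arm is back at its reset position. Tolerances are assumed values.

import math

def should_relocate(arm_pos, reset_pos, gaze_target, current_center,
                    reset_tolerance=0.01, min_jump=0.2):
    """Return True when the workspace should move to gaze_target.

    arm_pos, reset_pos, gaze_target, current_center : 3-D points (m)
    reset_tolerance : how close the arm must be to its reset position
    min_jump        : minimum gaze offset that justifies a relocation
    """
    at_reset = math.dist(arm_pos, reset_pos) <= reset_tolerance
    far_enough = math.dist(gaze_target, current_center) > min_jump
    return at_reset and far_enough
```

    Gating the jump on the arm being at its reset position is what makes the relocation robust: incidental glances during active touching cannot drag the workspace away mid-exploration.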

    Gaze Augmented Hand-Based Kinesthetic Interaction: What You See Is What You Feel

    Kinesthetic interaction between the user and the computer mainly utilizes hand-based input with force-feedback devices. There are two major shortcomings in hand-based kinesthetic interaction: physical fatigue associated with continuous hand movements and the limited workspace of current force-feedback devices for accurately exploring a large environment. To address these shortcomings, we developed two interaction techniques that use eye gaze as an additional input modality: HandGazeTouch and GazeTouch. HandGazeTouch combines eye gaze and hand motion as the input for kinesthetic interaction, i.e., it uses eye gaze to point and hand motion to touch. GazeTouch replaces all hand motions in touch behavior with eye gaze, i.e., it uses eye gaze to point and gaze dwell time to trigger the touch. In both interaction techniques, the user feels the haptic feedback through the force-feedback device. The gaze-based techniques were evaluated in a softness-discrimination experiment by comparing them to the traditional kinesthetic interface, HandTouch, which uses only hand-based input. The results indicate that the HandGazeTouch technique is not only as accurate, natural, and pleasant as the traditional interface but also more efficient.
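    The dwell-time trigger that GazeTouch uses in place of a hand-driven touch can be sketched as a small state machine: a selection fires once gaze has rested on the same target for a threshold duration, then resets so it fires only once per dwell. The class name, threshold, and update interface below are assumptions for illustration.

```python
# Sketch of a dwell-time trigger for gaze-driven touch: a touch fires
# once gaze has stayed on one target for dwell_s seconds. The 0.5 s
# default is an assumed value, not taken from the published study.

class DwellTrigger:
    def __init__(self, dwell_s=0.5):
        self.dwell_s = dwell_s
        self.target = None      # target currently being dwelled on
        self.since = None       # timestamp when that dwell started

    def update(self, target, t):
        """Feed the current gaze target and timestamp (seconds);
        return True exactly once per completed dwell."""
        if target != self.target:
            self.target, self.since = target, t   # dwell restarts
            return False
        if self.since is not None and t - self.since >= self.dwell_s:
            self.since = None                     # fire once, then reset
            return True
        return False
```

    Resetting `since` after firing prevents the trigger from refiring on every subsequent gaze sample while the user keeps looking at the same object.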